Sharp Bounds on the Approximation Rates, Metric Entropy, and n-Widths of Shallow Neural Networks

Authors

Jonathan W. Siegel and Jinchao Xu

Abstract

In this article, we study approximation properties of the variation spaces corresponding to shallow neural networks with a variety of activation functions. We introduce two main tools for estimating the metric entropy, approximation rates, and n-widths of these spaces. First, we introduce the notion of a smoothly parameterized dictionary and give upper bounds on the nonlinear approximation rates, metric entropy, and n-widths of its absolute convex hull. The upper bounds depend upon the order of smoothness of the parameterization. This result is applied to dictionaries of ridge functions corresponding to shallow neural networks, and it improves upon existing results in many cases. Next, we provide a method for lower bounding the metric entropy and n-widths of variation spaces which contain certain classes of ridge functions. This result gives sharp lower bounds on the $$L^2$$-approximation rates, metric entropy, and n-widths for variation spaces corresponding to neural networks with a range of important activation functions, including ReLU$$^k$$ activation functions and sigmoidal activation functions with bounded variation.
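To fix ideas, the objects named in the abstract can be sketched as follows (a minimal sketch in standard notation; the precise parameter domains and normalizations here are our assumptions, not quoted from the paper). For an activation function $$\sigma$$, the dictionary of ridge functions and its variation space are

$$\mathbb{D}_\sigma = \{\sigma(\omega\cdot x + b) : \omega\in S^{d-1},\ b\in[c_1,c_2]\},$$

$$K_1(\mathbb{D}_\sigma) = \overline{\Big\{\sum_{j=1}^n a_j d_j : n\in\mathbb{N},\ d_j\in\mathbb{D}_\sigma,\ \sum_{j=1}^n |a_j|\le 1\Big\}},$$

that is, the closed absolute convex hull of the dictionary. For the ReLU$$^k$$ activation $$\sigma_k(t)=\max(0,t)^k$$ on inputs in dimension d, the sharp $$L^2$$-approximation rate for n-term expansions from the dictionary established in this line of work takes the form

$$\inf_{f_n} \|f - f_n\|_{L^2} \lesssim n^{-\frac{1}{2}-\frac{2k+1}{2d}} \quad\text{for } f\in K_1(\mathbb{D}_{\sigma_k}).$$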


Related Articles

Bounds on rates of variable-basis and neural-network approximation

Tightness of bounds on rates of approximation by feedforward neural networks is investigated in a more general context of nonlinear approximation by variable-basis functions. Tight bounds on the worst case error in approximation by linear combinations of elements of an orthonormal variable basis are derived.
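For orientation, the baseline that such tightness results refine is the classical Maurey–Jones–Barron bound (a standard result, restated here in our notation rather than quoted from the article): if f lies in the closed convex hull of a set G in a Hilbert space H with $$\sup_{g\in G}\|g\|_H \le B$$, then for every n there is a convex combination $$f_n$$ of n elements of G with

$$\|f - f_n\|_H^2 \le \frac{B^2 - \|f\|_H^2}{n},$$

so the worst-case error decays like $$n^{-1/2}$$ independently of the dimension.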


Metric entropy, n-widths, and sampling of functions on manifolds

We first investigate the asymptotics of the Kolmogorov metric entropy and nonlinear n-widths of approximation spaces on some function classes on manifolds and quasi-metric measure spaces. Secondly, we develop constructive algorithms to represent those functions within a prescribed accuracy. The constructions can be based on either spectral information or scattered samples of the target funct...


Sharp Bounds on the PI Spectral Radius

In this paper some upper and lower bounds for the greatest eigenvalues of the PI and vertex PI matrices of a graph G are obtained. Those graphs for which these bounds are best possible are characterized.


Error Bounds for Approximation with Neural Networks

In this paper we prove convergence rates for the problem of approximating functions f by neural networks and similar constructions. We show that the smoother the activation functions are, the better the rates, provided that f satisfies an integral representation. We give error bounds not only in Hilbert spaces but in general Sobolev spaces $$W^{m,r}(\Omega)$$. Finally, we apply our results to a class o...
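The integral representation assumed for f typically takes the following generic form (a sketch in our notation; the paper's exact hypotheses may differ):

$$f(x) = \int \sigma(\omega\cdot x + b)\, d\mu(\omega,b)$$

for a finite signed measure $$\mu$$; replacing $$\mu$$ by an n-atom discretization yields a network $$\sum_{j=1}^n a_j\,\sigma(\omega_j\cdot x + b_j)$$ whose error can be bounded in terms of n and the smoothness of $$\sigma$$.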


Geometric Rates of Approximation by Neural Networks

Model complexity of feedforward neural networks is studied in terms of rates of variable-basis approximation. Sets of functions, for which the errors in approximation by neural networks with n hidden units converge to zero geometrically fast as n increases, are described. However, the geometric speed of convergence depends on parameters, which are specific for each function to be appr...



Journal

Journal title: Foundations of Computational Mathematics

Year: 2022

ISSN: 1615-3383, 1615-3375

DOI: https://doi.org/10.1007/s10208-022-09595-3